Search results for "evolutionary computation"
Showing 10 of 113 documents
SSPMO: A Scatter Tabu Search Procedure for Non-Linear Multiobjective Optimization
2007
We describe the development and testing of a metaheuristic procedure, based on the scatter-search methodology, for the problem of approximating the efficient frontier of nonlinear multiobjective optimization problems with continuous variables. Recent applications of scatter search have shown its merit as a global optimization technique for single-objective problems. However, the application of scatter search to multiobjective optimization problems has not been fully explored in the literature. We test the proposed procedure on a suite of problems that have been used extensively in multiobjective optimization. Additional tests are performed on instances that are an extension of those consid…
An evolutionary restricted neighborhood search clustering approach for PPI networks
2014
Protein-protein interaction networks have been broadly studied in the last few years in order to understand the behavior of proteins inside the cell. Proteins that interact with each other often share common biological functions or participate in the same biological process. Thus, discovering protein complexes made of groups of closely related proteins can be useful for predicting protein functions. Clustering techniques have been widely employed to detect significant biological complexes. In this paper, we integrate one of the most popular network clustering techniques, namely Restricted Neighborhood Search Clustering (RNSC), with evolutionary computation. The two cost functions intr…
Multilayer neural networks: an experimental evaluation of on-line training methods
2004
Artificial neural networks (ANN) are inspired by the structure of biological neural networks and their ability to integrate knowledge and learning. In ANN training, the objective is to minimize the error over the training set. The most popular method for training these networks is back propagation, a gradient descent technique. Other non-linear optimization methods such as conjugate directions set or conjugate gradient have also been used for this purpose. Recently, metaheuristics such as simulated annealing, genetic algorithms, and tabu search have also been adapted to this context. There are situations in which the necessary training data are being generated in real time, and an extensive tr…
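The gradient-descent view of training sketched in the abstract can be illustrated with a single sigmoid unit trained online (one sample at a time). Everything here, the toy dataset, the learning rate, and the single-neuron model, is an illustrative assumption, not the paper's experimental setup.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_online(samples, epochs=2000, lr=0.5):
    """Online gradient descent on the squared error of one sigmoid neuron."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, t in samples:          # samples are consumed one at a time
            y = sigmoid(w * x + b)
            err = y - t               # derivative of squared error w.r.t. y (up to a factor 2)
            grad = err * y * (1 - y)  # chain rule through the sigmoid
            w -= lr * grad * x
            b -= lr * grad
    return w, b

# Toy task: output 1 for positive inputs, 0 for negative ones.
data = [(-2, 0), (-1, 0), (1, 1), (2, 1)]
w, b = train_online(data)
```

The per-sample update is what distinguishes on-line training from batch methods: the weights move after every example rather than after a full pass over the training set.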
Structural bias in population-based algorithms
2014
Challenging optimisation problems are abundant in all areas of science and industry. Since the 1950s, scientists have responded to this by developing ever-diversifying families of ‘black box’ optimisation algorithms. The latter are designed to be able to address any optimisation problem, requiring only that the quality of any candidate solution can be calculated via a ‘fitness function’ specific to the problem. For such algorithms to be successful, at least three properties are required: (i) an effective informed sampling strategy, that guides the generation of new candidates on the basis of the fitnesses and locations of previously visited candidates; (ii) mechanisms to ensure eff…
Shaping communities of local optima by perturbation strength
2017
Recent work discovered that fitness landscapes induced by Iterated Local Search (ILS) may consist of multiple clusters, denoted as funnels or communities of local optima. Such studies exist only for perturbation operators (kicks) with low strength. We examine how different strengths of the ILS perturbation operator affect the number and size of clusters. We present an empirical study based on local optima networks from NK fitness landscapes. Our results show that a properly selected perturbation strength can help overcome the effect of ILS getting trapped in clusters of local optima. This has implications for designing effective ILS approaches in practice, where traditionally only small per…
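The role of perturbation strength described in this abstract can be sketched with a minimal Iterated Local Search: a hill climber on a bitstring, kicked by flipping a chosen number of random bits between restarts. The block-reward landscape and all parameters are illustrative assumptions, not the NK instances studied in the paper.

```python
import random

def fitness(bits):
    # Simple multimodal landscape: one point per block of four identical bits.
    return sum(1 for i in range(0, len(bits), 4)
               if len(set(bits[i:i + 4])) == 1)

def local_search(bits):
    """Single-bit hill climbing until no flip strictly improves fitness."""
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):
            cand = bits[:]
            cand[i] ^= 1
            if fitness(cand) > fitness(bits):
                bits, improved = cand, True
    return bits

def ils(n=16, strength=4, iters=200, seed=0):
    rng = random.Random(seed)
    cur = local_search([rng.randint(0, 1) for _ in range(n)])
    for _ in range(iters):
        kick = cur[:]
        for i in rng.sample(range(n), strength):  # perturbation strength = bits kicked
            kick[i] ^= 1
        cand = local_search(kick)
        if fitness(cand) >= fitness(cur):
            cur = cand
    return cur

best = ils()
```

With `strength` too small the kick rarely leaves the current cluster of local optima; raising it lets the search jump between clusters, which is the effect the paper quantifies on local optima networks.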
Path relinking and GRG for artificial neural networks
2006
Artificial neural networks (ANN) have been widely used for both classification and prediction. This paper is focused on the prediction problem in which an unknown function is approximated. ANNs can be viewed as models of real systems, built by tuning parameters known as weights. In training the net, the problem is to find the weights that optimize its performance (i.e., to minimize the error over the training set). Although the most popular method for training these networks is back propagation, other optimization methods such as tabu search or scatter search have been successfully applied to solve this problem. In this paper we propose a path relinking implementation to solve the neural ne…
Artificial Neural Networks to Predict the Power Output of a PV Panel
2014
The paper illustrates an adaptive approach based on different topologies of artificial neural networks (ANNs) for forecasting the power output of photovoltaic (PV) modules. The analysis of the PV module’s power output required detailed local climate data, which was collected by a dedicated weather monitoring system. The Department of Energy, Information Engineering, and Mathematical Models of the University of Palermo (Italy) built a weather monitoring system that worked together with a data acquisition system. The power output forecast is obtained using three different types of ANNs: a one-hidden-layer multilayer perceptron (MLP), a recursive neural network (RNN), and a gamma m…
Training Artificial Neural Networks With Improved Particle Swarm Optimization
2020
Particle Swarm Optimization (PSO) is popular for solving complex optimization problems; however, it is easily trapped in local minima. The authors modify the traditional PSO algorithm by adding an extra step, called PSO-Shock. The PSO-Shock algorithm starts out like the standard PSO algorithm. Entrapment in a local minimum is detected by counting stall generations; when the stall count reaches a prespecified value, the particles are perturbed, which helps them find better solutions than the current local minimum. The behavior of the PSO-Shock algorithm is studied on a well-known benchmark, Schwefel's function. With promising performance on Schwefel's function, the PSO-Shock algorithm is util…
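The stall-then-perturb idea described here can be sketched as plain PSO with a stall counter; when no global-best improvement occurs for a fixed number of generations, the swarm is re-seeded around the best point found. The update constants, the sphere test function, and the uniform re-seed are standard-PSO assumptions for illustration, not necessarily the paper's exact scheme.

```python
import random

def pso_shock(f, dim=2, n=20, iters=300, stall_limit=15, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    stall = 0
    for _ in range(iters):
        improved = False
        for i in range(n):
            for d in range(dim):
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest, improved = pbest[i][:], True
        stall = 0 if improved else stall + 1
        if stall >= stall_limit:     # "shock": perturb particles out of the basin
            for i in range(n):
                pos[i] = [g + rng.uniform(-1, 1) for g in gbest]
                vel[i] = [0.0] * dim
            stall = 0
    return gbest

sphere = lambda x: sum(v * v for v in x)
best = pso_shock(sphere)
```

Counting stall generations is cheap compared with, e.g., monitoring swarm diversity, which is presumably why the detection criterion is framed that way in the abstract.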
Hierarchical Evolutionary Algorithms and Noise Compensation via Adaptation
2007
Hierarchical Evolutionary Algorithms (HEAs) are nested algorithms composed of two or more evolutionary algorithms that share the same fitness function but operate on different populations. More specifically, the fitness of a Higher Level Evolutionary Algorithm (HLEA) is the optimal fitness value returned by a Lower Level Evolutionary Algorithm (LLEA). Due to this algorithmic formulation, HEAs can be efficiently applied to min-max problems. In this chapter, the application of HEAs is shown for two different min-max problems in the field of structural optimization. These two problems are the optimal design of an electrical grounding grid and of an elastic structure. Since the fitness of an HLEA is given by a…
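The nested structure described here, an upper level whose fitness is the optimum returned by a lower level, can be sketched for a toy min-max problem. Both levels below are simple (1+1)-style mutation hill climbers, and the saddle function x·y on [-1, 1]² is an illustrative assumption, not one of the chapter's structural problems.

```python
import random

def lower_level(x, gens=200, rng=None):
    """LLEA: minimise f(x, y) = x * y over y in [-1, 1] for a fixed x."""
    rng = rng or random.Random()
    y = rng.uniform(-1, 1)
    for _ in range(gens):
        cand = max(-1.0, min(1.0, y + rng.gauss(0, 0.3)))
        if x * cand < x * y:
            y = cand
    return x * y              # the LLEA's optimum becomes the HLEA's fitness

def upper_level(gens=200, seed=3):
    """HLEA: maximise over x the worst case min_y x * y found by the LLEA."""
    rng = random.Random(seed)
    x = rng.uniform(-1, 1)
    fx = lower_level(x, rng=rng)
    for _ in range(gens):
        cand = max(-1.0, min(1.0, x + rng.gauss(0, 0.3)))
        fc = lower_level(cand, rng=rng)
        if fc > fx:
            x, fx = cand, fc
    return x, fx

x, value = upper_level()
```

For this saddle the exact min-max value is 0 at x = 0 (since min_y x·y = -|x|), so the nested scheme should drive the upper-level variable toward the saddle point; every HLEA fitness evaluation costs one full LLEA run, which is the efficiency concern the chapter's adaptive noise compensation addresses.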
The computational power of continuous time neural networks
1997
We investigate the computational power of continuous-time neural networks with Hopfield-type units. We prove that polynomial-size networks with saturated-linear response functions are at least as powerful as polynomially space-bounded Turing machines.